Partners involved: UNIFI, UNIPI, UNISI, SSSA, Orthokey, Medea, WEART
Coordinator: Prof. Vincenzo Ferrari (UNIPI)
Outline: This sub-project is mainly focused on surgical and diagnostic devices.
X-ray imaging and endoscopic cameras enabled the introduction of Minimally Invasive Surgery (MIS), with tremendous advantages for patients and for society as a whole. Today, new imaging modalities are needed to increase the number of interventions that can be performed with MIS.
UNIPI aims to integrate and enhance the Multidisciplinary Robotic Surgery Center of the Azienda Ospedaliera Universitaria Pisana (AOUP) with the acquisition of 3D intraoperative imaging devices and the implementation of ad-hoc workflows, so as to take full advantage of the information content offered by its four da Vinci robots. Intraoperative imaging will allow robotic and image-guided techniques to unleash their full potential. This applies to all surgical disciplines, but it holds particularly true for specialties targeting soft tissues, which typically undergo significant, unpredictable non-rigid deformations during surgery. The imaging applications developed with the scanner will reach TRL 7 (including clinical trials), and the scanner will be useful both for clinical practice and for engineering R&D.
The da Vinci® Surgical System allows near-infrared (NIR) fluorescence imaging with indocyanine green (ICG) to be integrated into the surgical field. Indocyanine green is a diagnostic reagent that emits fluorescence at wavelengths ≥820 nm after stimulation with a laser beam or NIR light. UNIFI has already demonstrated that the intraoperative use of NIR imaging after submucosal injection of ICG around gastric tumors may improve the intraoperative visualization of lymph nodes and help verify complete lymph node removal during robotic surgery. However, as a passive fluorescent dye, ICG lacks a tumor-specific interaction mechanism. To overcome this limitation, UNIFI aims to develop a new NIR fluorescence imaging approach based on ICG labeled with the clinically non-toxic tumor-targeting peptide p28. Subcutaneous implantation of NCI-N87 human gastric tumor cells into immunocompromised mice will be used as a platform to test the specific uptake of the ICG-p28 complex in gastric tumor cells and its imaging properties. The data obtained from this murine model will be used to design a surgical protocol for patients undergoing surgery with the da Vinci® Surgical System and its NIR fluorescence imaging.
Miniaturization of surgical and diagnostic instruments is a key aspect in improving the outcomes of clinical practice and reducing patient trauma. Blending competencies from biorobotics and bioengineering, SSSA will develop a modular and flexible interventional platform composed of self-assembling, reconfigurable, multi-functional modules for early-stage diagnosis and minimally invasive intervention in the gastrointestinal and abdominal regions. Single modules will embed actuation (e.g. grippers), sensing (e.g. vision and force), and control/communication functionalities; through smart mechatronic (e.g. magneto-electromechanical) interfaces, operation-driven medical platforms will be assembled inside body cavities and lumens through hybrid accesses to execute teleoperated/assisted non-invasive medical interventions. Starting from feasibility studies carried out by SSSA in recent years, we expect to reach TRL 6 for each of the developed technologies and TRL 5 for the integrated interventional platform, which will be validated in realistic in-vitro and ex-vivo simulators in collaboration with FTGM and AOUP surgeons.
Most interventions, however, still have to be performed with the traditional open approach, because the manipulative tasks are too complex and require the dexterity and flexibility of the human hand. For this reason, the spoke will also work to empower manual surgery and reduce the invasiveness of these interventions.
Surgical navigation based on augmented reality (AR) can support and ease open surgery in different surgical specialties, so the spoke will work to bring a wearable AR surgical navigator into routine clinical practice. In recent years, UNIPI has worked actively to address and solve some of the technological and human-factor limitations that still hamper the effective integration of these new devices into the surgical workflow. A dedicated activity will therefore focus on completing the final engineering stages required to bring to market a novel wearable AR surgical navigation platform. The platform comprises a head-mounted display (HMD) and dedicated software specifically designed for intraoperative surgical guidance. A previous version of the AR surgical navigation platform was developed and clinically assessed in maxillofacial surgery within the framework of the UNIPI-coordinated H2020 project VOSTARS. The HMD is currently undergoing an engineering process also supported by a regional project (ARTS4.0 POR FESR 2014–2020). The existing prototype is paired with a software platform that Orthokey will fully engineer to comply with EU medical device regulations. The final platform will then be clinically tested in regional university hospitals in two surgical specialties: orthopaedic surgery and neurosurgery.
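As a purely illustrative aside (this is not the VOSTARS/Orthokey implementation, whose internals are not described here), the minimal Python sketch below shows the kind of rigid point-based registration that underlies most AR surgical navigation pipelines: corresponding anatomical landmarks from the preoperative plan and from intraoperative digitization are aligned with a Kabsch/SVD fit before virtual content can be overlaid on the patient through the HMD. All function and variable names are hypothetical.

```python
# Illustrative sketch only: rigid point-based (landmark) registration via the
# Kabsch/SVD method, a standard building block of AR surgical navigation.
# It is NOT the actual VOSTARS/Orthokey implementation.
import numpy as np

def register_rigid(source_pts: np.ndarray, target_pts: np.ndarray):
    """Find rotation R and translation t minimizing ||R @ source + t - target||.

    source_pts, target_pts: (N, 3) arrays of corresponding landmarks, e.g.
    points picked on the preoperative plan and the same points digitized
    intraoperatively with a tracked pointer (hypothetical setup).
    """
    src_c = source_pts.mean(axis=0)
    tgt_c = target_pts.mean(axis=0)
    H = (source_pts - src_c).T @ (target_pts - tgt_c)            # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = tgt_c - R @ src_c
    return R, t

# Toy usage: four planned landmarks and their (noise-free) intraoperative positions.
plan = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 10.0, 0.0], [0.0, 0.0, 10.0]])
theta = np.deg2rad(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
patient = plan @ R_true.T + np.array([5.0, -2.0, 1.0])
R, t = register_rigid(plan, patient)
print("RMS residual (mm):",
      np.sqrt(np.mean(np.sum((plan @ R.T + t - patient) ** 2, axis=1))))
```

In a real navigator this rigid alignment would be complemented by camera/display calibration, tracking, and accuracy verification steps, which are outside the scope of this sketch.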
UNISI and WEART will develop a system integrating a robotic arm with innovative grippers, interfaces, and control algorithms to assist surgeons during operations. UNISI will focus on the development of highly versatile, modular, sterilizable collaborative grippers that can manipulate several different tools without requiring complex tool-change procedures. Together with WEART, UNISI will also develop human-robot interfaces comprising a set of wearable devices capable of detecting the user's commands and of providing cutaneous haptic feedback that encodes information on the robot and task state. The envisaged interfaces will create a bilateral human-robot communication channel and will be designed for and with surgeons (also considering virtual simulation and training) to ensure their ergonomics and usability. Collaborative grippers and human-robot interfaces start from TRL 4 and are expected to reach TRL 6, while the integrated system is expected to reach TRL 5.
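As an equally illustrative sketch (the actual UNISI/WEART encoding strategy is not specified here), the snippet below shows one simple way such cutaneous feedback could encode robot and task state: a measured gripper force is mapped to the vibration amplitude of a hypothetical wearable actuator, while a discrete task event changes the vibration frequency. The device interface, force range, and frequencies are all assumptions.

```python
# Illustrative sketch, not the UNISI/WEART design: map robot/task state to a
# cutaneous (vibrotactile) cue on a hypothetical wearable haptic device.
from dataclasses import dataclass

@dataclass
class HapticCue:
    amplitude: float      # normalized 0..1 drive level of the vibrotactile actuator
    frequency_hz: float   # vibration frequency used to signal the task state

def encode_state(grip_force_n: float, max_force_n: float, target_reached: bool) -> HapticCue:
    """Encode gripper force as vibration amplitude; signal task events by frequency.

    grip_force_n:   force currently measured at the gripper (assumed sensor).
    max_force_n:    saturation level chosen for the mapping.
    target_reached: example of a discrete task-state event.
    """
    amplitude = min(max(grip_force_n / max_force_n, 0.0), 1.0)   # clamp to [0, 1]
    frequency = 100.0 if target_reached else 250.0               # distinct event tone
    return HapticCue(amplitude=amplitude, frequency_hz=frequency)

# Example: 6 N of grip force out of a 10 N range, target not yet reached.
print(encode_state(6.0, 10.0, False))   # HapticCue(amplitude=0.6, frequency_hz=250.0)
```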
Regarding diagnosis, SSSA will develop, test, and validate telerobotic solutions for remote ultrasound diagnosis and for the remote control of clinical and pharmaceutical laboratories. We expect to advance the technology from TRL 5 to TRL 6 through testing and validation in the clinical environment, supported by clinical evaluation and by collaboration with Esaote as a relevant company in the field. The proposed robotic solution will enable advanced remote manipulation for the management of clinical laboratories and for the physical examination of patients, overcoming the current limitations of telemedicine systems.
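The remote-manipulation idea can likewise be illustrated with a deliberately simplified sketch (assumptions only, not the SSSA/Esaote system): incremental motion commands from a remote sonographer are applied to the ultrasound probe only while the measured contact force stays below a safety threshold. Frame conventions, the force limit, and the retraction behaviour are hypothetical.

```python
# Illustrative sketch, not the SSSA/Esaote system: one simplified teleoperation
# step for a remote ultrasound probe, with a contact-force safety cap.
import numpy as np

FORCE_LIMIT_N = 8.0   # assumed maximum allowed probe-patient contact force

def teleop_step(probe_pos: np.ndarray,
                operator_delta: np.ndarray,
                contact_force_n: float,
                retract_step_m: float = 0.001) -> np.ndarray:
    """Return the next commanded probe position.

    probe_pos:       current probe position (metres, robot base frame).
    operator_delta:  incremental motion commanded remotely by the sonographer.
    contact_force_n: force measured along the probe axis (assumed sensor).
    """
    if contact_force_n > FORCE_LIMIT_N:
        # Safety behaviour: ignore the command and back off along +z (assumed
        # here to point away from the patient in this toy frame).
        return probe_pos + np.array([0.0, 0.0, retract_step_m])
    return probe_pos + operator_delta

# Example: a 2 mm lateral command is applied normally, then replaced by a small
# retraction when the measured contact force exceeds the limit.
pos = np.zeros(3)
pos = teleop_step(pos, np.array([0.002, 0.0, 0.0]), contact_force_n=3.0)
pos = teleop_step(pos, np.array([0.002, 0.0, 0.0]), contact_force_n=9.5)
print(pos)   # [0.002 0.    0.001]
```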
To cover all these different but complementary topics, the sub-project is divided into the following 7 activities:
- A9.1.a1 Training curricula based on patient-specific phantoms
- A9.1.a2 Acquisition and integration of the 3D scanner with the da Vinci robots at Pisa University Hospital
- A9.1.a3 Augmented Reality platform
- A9.1.a4 NIR fluorescence imaging validated in humans
- A9.1.a5 Telerobotic solutions for remote ultrasound diagnosis
- A9.1.a6 Integrated interventional platform for modular/flexible MIS
- A9.1.a7 Haptically augmented VR simulator for medical collaborative robotic grippers and interfaces